A proximal subgradient algorithm with extrapolation for structured nonconvex nonsmooth problems

Authors

Abstract

In this paper, we consider a class of structured nonconvex nonsmooth optimization problems in which the objective function is formed by summing a possibly nonsmooth nonconvex function and a differentiable function with Lipschitz continuous gradient, subtracted by a weakly convex function. This general framework allows us to tackle problems involving nonconvex loss functions and problems with specific nonconvex constraints, and it has many applications such as signal recovery, compressed sensing, and optimal power flow distribution. We develop a proximal subgradient algorithm with extrapolation for solving these problems with guaranteed subsequential convergence to a stationary point. Convergence of the whole sequence generated by our algorithm is also established under the widely used Kurdyka–Łojasiewicz property. To illustrate the promising numerical performance of the proposed algorithm, we conduct experiments on two important models: a compressed sensing problem with nonconvex regularization and an optimal power flow problem with distributed energy resources.
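The iteration described in the abstract can be sketched roughly as follows. The concrete splitting below (an ℓ1-minus-ℓ2 regularized least-squares model, a common nonconvex compressed-sensing surrogate), the constant extrapolation coefficient, and the 1/L step size are illustrative assumptions, not the paper's exact algorithm or parameter rules.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_subgrad_extrapolation(A, b, lam=0.1, beta=0.5, n_iter=500):
    """Sketch of a proximal subgradient step with extrapolation for
        min_x  0.5||Ax - b||^2 + lam*||x||_1 - lam*||x||_2,
    where g(x) = 0.5||Ax - b||^2 is smooth with Lipschitz gradient,
    f(x) = lam*||x||_1 is prox-friendly, and h(x) = lam*||x||_2 is the
    weakly convex part handled through a subgradient."""
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of grad g
    step = 1.0 / L
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for _ in range(n_iter):
        y = x + beta * (x - x_prev)   # extrapolation (momentum) point
        nx = np.linalg.norm(x)
        xi = lam * x / nx if nx > 0 else np.zeros(n)  # subgradient of h at x
        grad = A.T @ (A @ y - b)      # gradient of g at y
        x_prev, x = x, soft_threshold(y - step * (grad - xi), step * lam)
    return x
```

The subgradient of the concave part is moved into the forward step, so each iteration costs one gradient evaluation plus one cheap shrinkage, the same per-iteration profile as a convex proximal gradient method.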


Similar references

Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems

In this paper, we study the proximal gradient algorithm with extrapolation for minimizing the sum of a Lipschitz differentiable function and a proper closed convex function. Under the error bound condition used in [19] for analyzing the convergence of the proximal gradient algorithm, we show that there exists a threshold such that if the extrapolation coefficients are chosen below this threshol...


Benson's algorithm for nonconvex multiobjective problems via nonsmooth Wolfe duality

In this paper, we propose an algorithm to obtain an approximation set of the (weakly) nondominated points of nonsmooth multiobjective optimization problems with equality and inequality constraints. We use an extension of the Wolfe duality to construct the separating hyperplane in Benson's outer algorithm for multiobjective programming problems with subdifferentiable functions. We also fo...


Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems

In this paper, we introduce a stochastic projected subgradient method for weakly convex (i.e., uniformly prox-regular) nonsmooth, nonconvex functions—a wide class of functions which includes the additive and convex composite classes. At a high-level, the method is an inexact proximal point iteration in which the strongly convex proximal subproblems are quickly solved with a specialized stochast...


An approximate subgradient algorithm for unconstrained nonsmooth, nonconvex optimization

In this paper a new algorithm for minimizing locally Lipschitz functions is developed. Descent directions in this algorithm are computed by solving a system of linear inequalities. The convergence of the algorithm is proved for quasidifferentiable semismooth functions. We present the results of numerical experiments with both regular and nonregular objective functions. We also compare the propo...


Proximal alternating linearized minimization for nonconvex and nonsmooth problems

We introduce a proximal alternating linearized minimization (PALM) algorithm for solving a broad class of nonconvex and nonsmooth minimization problems. Building on the powerful Kurdyka–Łojasiewicz property, we derive a self-contained convergence analysis framework and establish that each bounded sequence generated by PALM globally converges to a critical point. Our approach allows us to analyze va...
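The PALM scheme summarized above alternates prox-linearized steps over blocks of variables. A minimal sketch on a standard test problem follows; the choice of nonnegative matrix factorization as the model, the exact 1/Lipschitz block step sizes, and the iteration count are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def palm_nmf(M, r, n_iter=200, seed=0):
    """Sketch of PALM on nonnegative matrix factorization:
        min_{X>=0, Y>=0}  0.5 * ||M - X @ Y||_F^2.
    Each block update is a gradient step on the smooth coupling term
    followed by the prox of the nonnegativity indicator, i.e. a
    projection onto the nonnegative orthant."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    X = rng.random((m, r))
    Y = rng.random((r, n))
    for _ in range(n_iter):
        # Block X: step 1/Lx, Lx = Lipschitz constant of grad_X (= ||Y Y^T||_2).
        Lx = np.linalg.norm(Y @ Y.T, 2) + 1e-12
        X = np.maximum(X - (X @ Y - M) @ Y.T / Lx, 0.0)
        # Block Y: step 1/Ly, Ly = Lipschitz constant of grad_Y (= ||X^T X||_2).
        Ly = np.linalg.norm(X.T @ X, 2) + 1e-12
        Y = np.maximum(Y - X.T @ (X @ Y - M) / Ly, 0.0)
    return X, Y
```

With these block step sizes each half-step decreases the objective, which is the descent property the cited convergence analysis builds on.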



Journal

Journal: Numerical Algorithms

Year: 2023

ISSN: 1017-1398, 1572-9265

DOI: https://doi.org/10.1007/s11075-023-01554-5